Adding transmitters dramatically boosts coded-caching gains for finite file sizes

Authors

  • Eleftherios Lampiris
  • Petros Elia
Abstract

In the context of coded caching in the K-user BC, our work reveals the surprising fact that having multiple (L) transmitting antennas dramatically ameliorates the long-standing subpacketization bottleneck of coded caching by reducing the required subpacketization to approximately its Lth root, thus boosting the actual DoF by a multiplicative factor of up to L. In asymptotic terms, this reveals that as long as L scales with the theoretical caching gain, the full cumulative (multiplexing + full caching) gains are achieved with constant subpacketization. This is the first time, in any known setting, that unbounded caching gains appear under finite file-size constraints. The caching gains achieved here are up to L times higher than any caching gains previously experienced in any single- or multi-antenna fully-connected setting, thus offering a multiplicative mitigation to a subpacketization problem that was previously known to hard-bound caching gains to small constants. The proposed scheme is practical and works for all values of K, L and all cache sizes. The scheme's gains show in practice: e.g., for K = 100, when L = 1 the theoretical caching gain of G = 10 under the original coded caching algorithm would have required subpacketization S1 = C(K, G) = C(100, 10) > 10^13, while if extra transmitting antennas were added, the subpacketization was previously known to match or exceed S1. Now for L = 5, our scheme offers the theoretical (unconstrained) cumulative DoF dL = L + G = 5 + 10 = 15, with subpacketization SL = C(K/L, G/L) = C(100/5, 10/5) = 190. The work extends to the multi-server and cache-aided IC settings, while the scheme's performance, given subpacketization SL = C(K/L, G/L), is within a factor of 2 of the optimal linear sum-DoF.
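The arithmetic in the abstract can be checked directly. A minimal sketch (not from the paper, just reproducing its quoted numbers with Python's math.comb):

```python
from math import comb

K, G = 100, 10  # users and theoretical caching gain from the abstract

# Single-antenna case (L = 1): subpacketization C(K, G)
S1 = comb(K, G)
print(S1)  # about 1.7e13, exceeding the 10^13 bound quoted in the abstract

# Multi-antenna case (L = 5): subpacketization C(K/L, G/L), cumulative DoF L + G
L = 5
SL = comb(K // L, G // L)
dL = L + G
print(SL, dL)  # 190 15
```

The roughly 10^11-fold drop in subpacketization for the same caching gain is exactly the "Lth root" reduction the abstract describes.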


Related articles

New Order-Optimal Decentralized Coded Caching Schemes with Good Performance in the Finite File Size Regime

Recently, a new class of random coded caching schemes has received increasing interest, as they can achieve order-optimal memory-load tradeoff through decentralized content placement. However, most of these existing decentralized schemes may not provide enough coded-multicasting opportunities in the practical operating regime where the file size is limited. In this paper, we propose...

Full text

Coded Caching with Heterogeneous Cache Sizes and Link Qualities: The Two-User Case

The centralized coded caching problem is studied under heterogeneous cache sizes and channel qualities from the server to the users, focusing on the two-user case. More specifically, a server holding N files is considered to be serving two users with different cache sizes, each requesting a single file, and it is assumed that in addition to the shared common link, each user also has a private l...

Full text

Decentralized Coded Caching Without File Splitting

Coded caching is an effective technique to reduce the redundant traffic in wireless networks. The existing coded caching schemes require the splitting of files into a possibly large number of subfiles, i.e., they perform coded subfile caching. Keeping the files intact during the caching process would actually be appealing, broadly speaking because of its simpler implementation. However, little ...

Full text

Structural Properties of Uncoded Placement Optimization for Coded Delivery

A centralized coded caching scheme has been proposed by Maddah-Ali and Niesen to reduce the worst-case load of a network consisting of a server with access to N files and connected through a shared link to K users, each equipped with a cache of size M . However, this centralized coded caching scheme is not able to take advantage of a non-uniform, possibly very skewed, file popularity distributi...

Full text

Degrees of Freedom of Interference Networks with Transmitter-Side Caches

This paper studies cache-aided interference networks with arbitrary number of transmitters and receivers, whereby each transmitter has a cache memory of finite size. Each transmitter fills its cache memory from a content library of files in the placement phase. In the subsequent delivery phase, each receiver requests one of the library files, and the transmitters are responsible for delivering ...

Full text


Journal title:
  • CoRR

Volume abs/1802.03389  Issue

Pages  -

Publication date 2018